 Gesture Recognition


Bridging the Communication Gap: Artificial Agents Learning Sign Language through Imitation

arXiv.org Artificial Intelligence

Artificial agents, particularly humanoid robots, interact with their environment, objects, and people using cameras, actuators, and physical presence. Their communication methods are often pre-programmed, limiting their actions and interactions. Our research explores acquiring non-verbal communication skills through learning from demonstrations, with potential applications in sign language comprehension and expression. In particular, we focus on imitation learning for artificial agents, exemplified by teaching a simulated humanoid American Sign Language. We use computer vision and deep learning to extract information from videos, and reinforcement learning to enable the agent to replicate observed actions. Compared to other methods, our approach eliminates the need for additional hardware to acquire information. We demonstrate how the combination of these different techniques offers a viable way to learn sign language. Our methodology successfully teaches 5 different signs involving the upper body (i.e., arms and hands). This research paves the way for advanced communication skills in artificial agents.
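
The abstract describes a two-stage pipeline: computer vision and deep learning extract pose information from demonstration videos, and reinforcement learning rewards the simulated humanoid for reproducing it. The paper does not name its tooling, so the sketch below is only illustrative: it assumes MediaPipe Holistic for upper-body and hand landmarks and a simple negative-distance pose-matching reward; `extract_upper_body_keypoints` and `imitation_reward` are hypothetical names, not the authors' code.

```python
# Illustrative sketch only -- the paper does not name its tooling.
# Assumes MediaPipe Holistic for upper-body and hand landmarks
# (pip install mediapipe opencv-python numpy).
import cv2
import mediapipe as mp
import numpy as np

mp_holistic = mp.solutions.holistic


def extract_upper_body_keypoints(video_path):
    """Return per-frame arm and hand landmarks from a demonstration video."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp_holistic.Holistic(min_detection_confidence=0.5) as holistic:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            keypoints = []
            for lm_set in (results.pose_landmarks,
                           results.left_hand_landmarks,
                           results.right_hand_landmarks):
                if lm_set is not None:
                    keypoints.extend([lm.x, lm.y, lm.z] for lm in lm_set.landmark)
            frames.append(np.array(keypoints) if keypoints else None)
    cap.release()
    return frames


def imitation_reward(agent_keypoints, reference_keypoints):
    """Hypothetical pose-matching reward: negative mean per-joint distance
    between the simulated agent's keypoints and the demonstrated ones."""
    diff = agent_keypoints.reshape(-1, 3) - reference_keypoints.reshape(-1, 3)
    return -float(np.mean(np.linalg.norm(diff, axis=-1)))
```

In a full imitation-learning setup, a reward with this shape would be fed at each timestep to a reinforcement-learning algorithm controlling the humanoid's arm and hand joints; the sketch only illustrates the signal, not any particular RL method.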


A real-time Artificial Intelligence system for learning Sign Language

arXiv.org Artificial Intelligence

A primary challenge for the deaf and hearing-impaired community is the communication gap with hearing society, which can greatly affect their daily lives and lead to social exclusion. To foster inclusivity, our work focuses on developing a cost-effective, resource-efficient, and open technology based on Artificial Intelligence, designed to help people learn and use Sign Language for communication. The analysis presented in this paper aims to enrich the recent scientific literature on Artificial Intelligence-based Sign Language solutions, with a particular focus on American Sign Language (ASL). This research has yielded promising preliminary results and serves as a basis for further development.
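
The abstract stresses a cost-effective, resource-efficient, real-time design but does not describe the system's architecture, so the sketch below is only a plausible illustration of that design goal, not the authors' method: hand landmarks from a webcam are classified by a lightweight nearest-neighbour model so the whole loop can run on a CPU. `hand_features`, `run_realtime`, and the training data are hypothetical.

```python
# Plausible illustration only -- the paper's abstract gives no architectural details.
# Shows the kind of lightweight, CPU-only pipeline the stated design goals suggest:
# MediaPipe hand landmarks fed to a small nearest-neighbour classifier.
# (pip install mediapipe opencv-python scikit-learn numpy)
import cv2
import mediapipe as mp
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

mp_hands = mp.solutions.hands


def hand_features(rgb_frame, hands):
    """Flatten the 21 detected hand landmarks into a 63-d feature vector, or None."""
    results = hands.process(rgb_frame)
    if not results.multi_hand_landmarks:
        return None
    lms = results.multi_hand_landmarks[0].landmark
    return np.array([[lm.x, lm.y, lm.z] for lm in lms]).flatten()


def run_realtime(classifier):
    """Predict an ASL handshape label for each webcam frame in real time."""
    cap = cv2.VideoCapture(0)
    with mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.5) as hands:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            feats = hand_features(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB), hands)
            if feats is not None:
                print(classifier.predict([feats])[0])
            cv2.imshow("sign-language learner (press q to quit)", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):
                break
    cap.release()
    cv2.destroyAllWindows()


# Hypothetical usage: X_train / y_train are landmark vectors and sign labels
# collected offline with hand_features().
# clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
# run_realtime(clf)
```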


Learning sign language could give you super vision

Daily Mail - Science & tech

Researchers at the University of Sheffield have found that learning sign language can benefit hearing adults too, giving them faster reaction times in their peripheral vision. Improved peripheral vision is useful in many sports and for driving, making you more alert to changes at the edges of your visual field. The study also found that deaf adults have far better peripheral vision and reaction times than both hearing adults and hearing adults who use sign language. The research, conducted at the University of Sheffield's Academic Unit of Ophthalmology, found that learning a visual-spatial language such as British Sign Language (BSL) had a positive impact on adults' visual field response.